Sufficiency, classification, and the class-specific feature theorem

Author

  • Steven Kay
Abstract

A new proof of the class-specific feature theorem is given. The proof makes use of the observed data as opposed to the set of sufficient statistics as in the original formulation. We prove the theorem for the classical case, in which the parameter vector is deterministic and known, as well as for the Bayesian case, in which the parameter vector is modeled as a random vector with known prior probability density function. The essence of the theorem is that with a suitable normalization the probability density function of the sufficient statistic for each probability density function family can be used for optimal classification. One need not have knowledge of the probability density functions of the data under each hypothesis.
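The normalization described in the abstract can be illustrated with a small numerical sketch. This is a hypothetical example, not taken from the paper: two assumed classes whose sufficient statistics are the sample mean and the sum of squares, with a common N(0, 1) reference hypothesis. Because each statistic is sufficient for its class against the reference, the normalized statistic score equals the full-data log-likelihood ratio exactly.

```python
import numpy as np
from scipy import stats

# Hypothetical two-class illustration of the class-specific feature theorem
# (distributions and parameters are assumed for this sketch, not from the paper).
# Class 1: x_i ~ N(mu, 1), sufficient statistic T1 = sample mean.
# Class 2: x_i ~ N(0, sigma^2), sufficient statistic T2 = sum of squares.
# Common reference hypothesis H0: x_i ~ N(0, 1).
rng = np.random.default_rng(0)
n, mu, sigma = 50, 0.8, 1.6
x = rng.normal(mu, 1.0, n)  # simulate one record from class 1

# Class-specific scores: log p(T_j | H_j) - log p(T_j | H0). Dividing by the
# H0 density of the *same* statistic is the normalization the theorem requires.
t1 = x.mean()  # T1 | H1 ~ N(mu, 1/n);  T1 | H0 ~ N(0, 1/n)
score1 = (stats.norm.logpdf(t1, mu, 1 / np.sqrt(n))
          - stats.norm.logpdf(t1, 0.0, 1 / np.sqrt(n)))

t2 = np.sum(x ** 2)  # T2 | H0 ~ chi2(n);  T2 / sigma^2 | H2 ~ chi2(n)
score2 = (stats.chi2.logpdf(t2 / sigma ** 2, n) - 2 * np.log(sigma)
          - stats.chi2.logpdf(t2, n))

# Sufficiency makes each score equal the full-data log-likelihood ratio,
# even though neither score used the joint PDF of the raw data.
full1 = stats.norm.logpdf(x, mu, 1.0).sum() - stats.norm.logpdf(x, 0.0, 1.0).sum()
full2 = stats.norm.logpdf(x, 0.0, sigma).sum() - stats.norm.logpdf(x, 0.0, 1.0).sum()
assert np.isclose(score1, full1) and np.isclose(score2, full2)

decision = 1 if score1 > score2 else 2  # MAP decision under equal priors
```

The point of the sketch is the two assertions: the classifier built from the low-dimensional statistics reproduces the optimal full-data decision without ever evaluating the joint density of all n samples under each hypothesis.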


Similar articles

The PDF projection theorem and the class-specific method

In this paper, we present the theoretical foundation for optimal classification using class-specific features and provide examples of its use. A new probability density function (PDF) projection theorem makes it possible to project probability density functions from a low-dimensional feature space back to the raw data space. An M-ary classifier is constructed by estimating the PDFs of class-spec...
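The projection described above is commonly written as follows (notation assumed here for illustration: z_j denotes the class-j feature vector and H_0 a common reference hypothesis for which the raw-data PDF is known):

```latex
p_x(x \mid H_j) \;=\; \frac{p_{z_j}\big(z_j(x) \mid H_j\big)}{p_{z_j}\big(z_j(x) \mid H_0\big)}\, p_x(x \mid H_0)
```

The ratio on the right is computable entirely in the low-dimensional feature space, which is what makes per-class features usable in a single likelihood comparison.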

Full text

A Novel One Sided Feature Selection Method for Imbalanced Text Classification

Imbalanced data arise in many areas, such as text classification, credit card fraud detection, risk management, web page classification, image classification, medical diagnosis/monitoring, and biological data analysis. Classification algorithms tend to favor the majority class and may even treat minority-class data as outliers. Text data is one of t...

Full text

MLIFT: Enhancing Multi-label Classifier with Ensemble Feature Selection

Multi-label classification has gained significant attention in recent years, due to the increasing number of modern applications involving multi-label data. Despite its short history, different approaches have been presented to solve the task of multi-label classification. LIFT is a multi-label classifier which utilizes a new strategy for multi-label learning by leveraging label-specific ...

Full text

A New Framework for Distributed Multivariate Feature Selection

Feature selection is an important issue in the classification domain. Selecting features with maximum relevance to the class label and minimum redundancy among themselves improves classification accuracy. However, most current feature selection algorithms are centralized. In this paper, we suggest a distributed version of the mRMR featu...

Full text

Improved nonparametric discriminant analysis for hyperspectral image classification with limited training samples

Feature extraction plays an important role in improving hyperspectral image classification. Compared with parametric methods, nonparametric feature extraction methods perform better when classes are not normally distributed. Moreover, these methods can extract more features than parametric feature extraction methods. Nonparametric feature extraction methods use nonparametric s...

Full text


Journal:
  • IEEE Trans. Information Theory

Volume 46, Issue -

Pages -

Publication date: 2000